Search for: All records

Creators/Authors contains: "Han, X."

Note: Clicking a Digital Object Identifier (DOI) link will take you to an external site maintained by the publisher. Some full-text articles may not be available free of charge during the embargo (administrative) period.

Some links on this page may lead to non-federal websites, whose policies may differ from this site's.

  1. Free, publicly accessible full text available July 22, 2024
  2.
  3.
    Modern practice for training classification deepnets involves a terminal phase of training (TPT), which begins at the epoch where training error first vanishes. During TPT, the training error stays effectively zero, while training loss is pushed toward zero. Direct measurements of TPT, for three prototypical deepnet architectures and across seven canonical classification datasets, expose a pervasive inductive bias we call neural collapse (NC), involving four deeply interconnected phenomena. (NC1) Cross-example within-class variability of last-layer training activations collapses to zero, as the individual activations themselves collapse to their class means. (NC2) The class means collapse to the vertices of a simplex equiangular tight frame (ETF). (NC3) Up to rescaling, the last-layer classifiers collapse to the class means, or in other words, to the simplex ETF (i.e., to a self-dual configuration). (NC4) For a given activation, the classifier's decision collapses to simply choosing whichever class has the nearest training class mean (i.e., the nearest class center [NCC] decision rule). The symmetric and very simple geometry induced by the TPT confers important benefits, including better generalization performance, better robustness, and better interpretability.
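    As a rough illustration (a minimal NumPy sketch, not the paper's code; the helper names are ours), the simplex ETF of NC2 can be constructed explicitly, and NC1/NC4 can be checked on stored last-layer activations:

    ```python
    import numpy as np

    def simplex_etf(K):
        # K unit vectors whose pairwise inner products all equal -1/(K-1):
        # the vertices of a simplex equiangular tight frame (ETF).
        return np.sqrt(K / (K - 1)) * (np.eye(K) - np.ones((K, K)) / K)

    def nc_diagnostics(H, y):
        # H: (n, d) last-layer activations; y: (n,) integer class labels.
        classes = np.unique(y)
        means = np.stack([H[y == c].mean(axis=0) for c in classes])
        # NC1 proxy: mean squared distance of activations to their class mean
        # (this collapses toward zero during the terminal phase of training).
        within = np.mean([((H[y == c] - means[i]) ** 2).sum(axis=1).mean()
                          for i, c in enumerate(classes)])
        # NC4: nearest-class-center (NCC) decision for every activation.
        dists = ((H[:, None, :] - means[None, :, :]) ** 2).sum(axis=-1)
        ncc_pred = classes[np.argmin(dists, axis=1)]
        return within, ncc_pred

    # NC2 sanity check: the Gram matrix has 1 on the diagonal, -1/(K-1) off it.
    V = simplex_etf(4)
    print(np.round(V.T @ V, 3))
    ```

    Comparing `ncc_pred` against the network's own predictions measures how closely the trained classifier matches the NCC rule.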
  4. Free, publicly accessible full text available June 1, 2024
  5. Abstract: Charged-lepton-flavor violation is predicted in several new-physics scenarios. We update the analysis of $\tau$-lepton decays into a light charged lepton ($\ell = e^\pm$ or $\mu^\pm$) and a vector meson ($V^0 = \rho^0$, $\phi$, $\omega$, $K^{*0}$, or $\bar{K}^{*0}$) using $980~\mathrm{fb}^{-1}$ of data collected with the Belle detector at the KEKB collider. No significant excess of signal events is observed, and thus 90% credibility-level upper limits are set on the $\tau \to \ell V^0$ branching fractions, in the range $(1.7\text{–}4.3) \times 10^{-8}$. These limits improve on the previous results by 30% on average.
    Free, publicly accessible full text available June 1, 2024
  6. Abstract: Using a data sample of $980~\mathrm{fb}^{-1}$ collected with the Belle detector at the KEKB asymmetric-energy $e^+e^-$ collider, we study for the first time the singly Cabibbo-suppressed decays $\Omega_c^0 \to \Xi^-\pi^+$ and $\Omega_c^0 \to \Omega^-K^+$ and the doubly Cabibbo-suppressed decay $\Omega_c^0 \to \Xi^-K^+$. Evidence for an $\Omega_c^0$ signal in the $\Omega_c^0 \to \Xi^-\pi^+$ mode is reported with a significance of $4.5\sigma$, including systematic uncertainties. The ratio of branching fractions to the normalization mode $\Omega_c^0 \to \Omega^-\pi^+$ is measured to be $\mathcal{B}(\Omega_c^0 \to \Xi^-\pi^+)/\mathcal{B}(\Omega_c^0 \to \Omega^-\pi^+) = 0.253 \pm 0.052\,(\mathrm{stat.}) \pm 0.030\,(\mathrm{syst.})$. No significant signals are found in the $\Omega_c^0 \to \Xi^-K^+$ and $\Omega_c^0 \to \Omega^-K^+$ modes, and the upper limits at 90% confidence level on the ratios of branching fractions are determined to be $\mathcal{B}(\Omega_c^0 \to \Xi^-K^+)/\mathcal{B}(\Omega_c^0 \to \Omega^-\pi^+) < 0.070$ and $\mathcal{B}(\Omega_c^0 \to \Omega^-K^+)/\mathcal{B}(\Omega_c^0 \to \Omega^-\pi^+) < 0.29$.
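    A quick worked check (our arithmetic, not stated in the abstract): adding the quoted statistical and systematic uncertainties on the measured ratio in quadrature, the usual convention, gives

    ```latex
    \sigma_{\mathrm{tot}}
      = \sqrt{0.052^2 + 0.030^2} \approx 0.060,
    \qquad
    \frac{\mathcal{B}(\Omega_c^0 \to \Xi^-\pi^+)}
         {\mathcal{B}(\Omega_c^0 \to \Omega^-\pi^+)}
      = 0.253 \pm 0.060 \;\text{(total)}.
    ```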